Bayesian Quadrature



Neural Information Processing Systems

Bayesian model evidence gives a clear criterion for such model selection. However, computing model evidence requires integration over the likelihood, which is challenging, particularly when the likelihood is non-closed-form and/or expensive.
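To make the integral concrete, here is a toy illustration (my own sketch, not from any of the listed papers): for a conjugate Gaussian model the evidence p(d) = ∫ p(d|θ) p(θ) dθ is known in closed form, so we can check a plain Monte Carlo estimate against it. Bayesian quadrature targets exactly this kind of integral, but with far fewer likelihood evaluations.

```python
import numpy as np

# Toy conjugate model: theta ~ N(0, 1), d | theta ~ N(theta, 1).
# Analytically, p(d) = N(d; 0, 2). Plain Monte Carlo averages the
# likelihood over prior draws -- the integral BQ aims to do more
# sample-efficiently.
rng = np.random.default_rng(0)
d = 1.0
theta = rng.normal(0.0, 1.0, size=200_000)                # prior draws
lik = np.exp(-(d - theta) ** 2 / 2) / np.sqrt(2 * np.pi)  # p(d | theta)
mc_evidence = lik.mean()                                  # MC estimate of p(d)
true_evidence = np.exp(-d**2 / 4) / np.sqrt(4 * np.pi)    # N(d; 0, 2)
```

With 200,000 samples the Monte Carlo estimate agrees with the analytic evidence to roughly two decimal places; the point of the methods below is to get comparable accuracy from tens of evaluations rather than hundreds of thousands.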


Bayesian Quadrature: Gaussian Processes for Integration

Mahsereci, Maren, Karvonen, Toni

arXiv.org Machine Learning

Bayesian quadrature is a probabilistic, model-based approach to numerical integration, the estimation of intractable integrals, or expectations. Although Bayesian quadrature was popularised already in the 1980s, no systematic and comprehensive treatment has been published. The purpose of this survey is to fill this gap. We review the mathematical foundations of Bayesian quadrature from different points of view; present a systematic taxonomy for classifying different Bayesian quadrature methods along the three axes of modelling, inference, and sampling; collect general theoretical guarantees; and provide a controlled numerical study that explores and illustrates the effect of different choices along the axes of the taxonomy. We also provide a realistic assessment of practical challenges and limitations to application of Bayesian quadrature methods and include an up-to-date and nearly exhaustive bibliography that covers not only machine learning and statistics literature but all areas of mathematics and engineering in which Bayesian quadrature or equivalent methods have seen use.
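The core construction the survey describes can be sketched in a few lines: place a GP prior on the integrand, condition on function evaluations, and integrate the posterior in closed form. The sketch below is my own minimal illustration for an RBF kernel and a Gaussian integration measure, the one pairing where the kernel mean has a well-known closed form; it is not code from the survey.

```python
import numpy as np

def bq_gaussian(nodes, y, ell=1.0, sigma=1.0, jitter=1e-9):
    """Bayesian quadrature of f against N(0, sigma^2) under an RBF GP prior.

    Uses the closed-form kernel mean for k(x, x') = exp(-(x - x')^2 / (2 ell^2)).
    Returns the posterior mean and variance of the integral.
    """
    X = np.asarray(nodes, dtype=float)
    K = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * ell**2))
    K += jitter * np.eye(len(X))                 # jitter for numerical stability
    s2 = ell**2 + sigma**2
    # Kernel mean z_i = \int k(x, x_i) N(x; 0, sigma^2) dx  (closed form)
    z = np.sqrt(ell**2 / s2) * np.exp(-X**2 / (2 * s2))
    mean = z @ np.linalg.solve(K, y)
    # Initial variance \iint k(x, x') dN(x) dN(x'), also closed form here
    z_pp = np.sqrt(ell**2 / (ell**2 + 2 * sigma**2))
    var = z_pp - z @ np.linalg.solve(K, z)
    return mean, var

# E[X^2] = 1 under N(0, 1); ten nodes already recover it closely.
X = np.linspace(-3.0, 3.0, 10)
mean, var = bq_gaussian(X, X**2)
```

The posterior variance `var` is the distinguishing output: unlike a Monte Carlo estimate, BQ quantifies its own remaining uncertainty about the integral, which is what the active-sampling schemes discussed below exploit.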






Variational Bayesian Monte Carlo with Noisy Likelihoods

Neural Information Processing Systems

In the original formulation, observations are assumed to be exact (non-noisy), so the GP likelihood only included a small observation noise σ²_obs for numerical stability [32].


BayesSum: Bayesian Quadrature in Discrete Spaces

Kang, Sophia Seulkee, Briol, François-Xavier, Karvonen, Toni, Chen, Zonghao

arXiv.org Machine Learning

This paper addresses the challenging computational problem of estimating intractable expectations over discrete domains. Existing approaches, including Monte Carlo and Russian Roulette estimators, are consistent but often require a large number of samples to achieve accurate results. We propose a novel estimator, BayesSum, which is an extension of Bayesian quadrature to discrete domains. It is more sample efficient than alternatives due to its ability to make use of prior information about the integrand through a Gaussian process. We show this through theory, deriving a convergence rate significantly faster than Monte Carlo in a broad range of settings. We also demonstrate empirically that our proposed method does indeed require fewer samples on several synthetic settings as well as for parameter estimation for Conway-Maxwell-Poisson and Potts models.
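On a finite discrete domain the kernel mean is an exact finite sum, so the BQ construction carries over directly. The sketch below is my own illustrative example of this idea (a GP with an RBF kernel over integer points), not the BayesSum implementation itself; the kernel choice and node placement are assumptions for the demo.

```python
import numpy as np
from math import factorial

def discrete_bq(domain, pi, nodes, y, ell=3.0, jitter=1e-9):
    """BQ-style estimate of the discrete expectation sum_x pi(x) f(x).

    A GP with an RBF kernel on the integers is conditioned on f at `nodes`;
    the kernel mean z_i = sum_x k(x, x_i) pi(x) is computed exactly because
    the domain is finite. Illustrative sketch only, not the paper's method.
    """
    D = np.asarray(domain, dtype=float)
    Xn = np.asarray(nodes, dtype=float)
    k = lambda a, b: np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell**2))
    K = k(Xn, Xn) + jitter * np.eye(len(Xn))
    z = pi @ k(D, Xn)                     # exact kernel mean over the domain
    return z @ np.linalg.solve(K, y)      # posterior mean of the sum

# Truncated Poisson(3) on {0, ..., 20}; estimate E[X] from 11 evaluations of f(x) = x.
domain = np.arange(21)
w = np.array([3.0**x / factorial(x) for x in domain])
pi = w / w.sum()
nodes = domain[::2]
est = discrete_bq(domain, pi, nodes, nodes.astype(float))
true = float(pi @ domain)
```

Here the estimator observes f at only 11 of the 21 domain points yet recovers the expectation closely, because the GP prior lets it interpolate the unobserved values.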


Variational Bayesian Monte Carlo

Luigi Acerbi

Neural Information Processing Systems

We introduce here a novel sample-efficient inference framework, Variational Bayesian Monte Carlo (VBMC). VBMC combines variational inference with Gaussian-process based, active-sampling Bayesian quadrature, using the latter to efficiently approximate the intractable integral in the variational objective.


Export Reviews, Discussions, Author Feedback and Meta-Reviews

Neural Information Processing Systems

First provide a summary of the paper, and then address the following criteria: Quality, clarity, originality and significance. Overview: this paper presents a fast alternative to MC methods for approximating intractable integrals. The main idea behind Bayesian quadrature is to exploit smoothness assumptions and regularities in the likelihood surface, something which pure Monte Carlo ignores. Samples are then drawn according to some criterion - in this case, samples are chosen at the location of maximal expected posterior variance of the integrand. Intuitively, this is the location where the model knows least about the value of the integrand and so stands to gain the most information.
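The acquisition rule the review describes can be sketched directly: score each candidate by the GP posterior variance of the integrand (weighted here by the integration density, one common variant) and pick the maximizer. This is my own minimal illustration of uncertainty sampling, not the reviewed paper's exact criterion.

```python
import numpy as np

def next_node(X, candidates, density, ell=1.0, jitter=1e-9):
    """Uncertainty-sampling sketch: return the candidate where the
    density-weighted GP posterior variance of the integrand is largest."""
    X = np.asarray(X, dtype=float)
    C = np.asarray(candidates, dtype=float)
    k = lambda a, b: np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell**2))
    K = k(X, X) + jitter * np.eye(len(X))
    Kc = k(X, C)                                   # cross-covariances, (n, m)
    # Posterior variance v(c) = k(c, c) - k_c^T K^{-1} k_c, with k(c, c) = 1
    v = 1.0 - np.einsum('ij,ij->j', Kc, np.linalg.solve(K, Kc))
    score = density(C) ** 2 * v                    # weight by integration density
    return C[np.argmax(score)]

gauss = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
# With nodes at +/-1 under a standard normal density, the rule picks the
# high-density gap between them rather than the far tails.
x_next = next_node([-1.0, 1.0], np.linspace(-3, 3, 121), gauss)
```

The density weighting is the judgment call: an unweighted variance rule would chase the tails, where the integrand is uncertain but contributes little to the integral.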